Measuring the VC-Dimension Using Optimized Experimental Design
Authors
Abstract
VC-dimension is the measure of model complexity (capacity) used in VC-theory. Knowledge of the VC-dimension of an estimator is necessary for rigorous complexity control using analytic VC generalization bounds. Unfortunately, analytic estimates of the VC-dimension cannot be obtained in most cases. Hence, it has recently been proposed to measure the VC-dimension of an estimator experimentally, by fitting the theoretical formula to a set of experimental measurements of the frequency of errors on artificially generated data sets of varying sizes (Vapnik et al., 1994). However, it may be difficult to obtain an accurate estimate of the VC-dimension due to the variability of random samples in the experimental procedure proposed by Vapnik et al. (1994). We address this problem by proposing an improved design procedure for specifying the measurement points (i.e., the sample size and the number of repeated experiments at a given sample size). Our approach leads to a non-uniform design structure, as opposed to the uniform design structure used in the original paper (Vapnik et al., 1994). Our simulation results show that the proposed optimized design structure leads to more accurate estimation of the VC-dimension via the experimental procedure. Likewise, our results show that more accurate estimation of the VC-dimension leads to improved complexity control via analytic VC generalization bounds and hence better prediction accuracy.
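As a rough illustration of the measurement procedure the abstract refers to, the sketch below implements its two ingredients: repeated measurements of the maximum deviation of error rates between two half-samples (with the labels of the second half flipped before training), and a least-squares fit of the effective VC-dimension h through the theoretical relation ξ(n) ≈ Φ(n/h). The functional form and constants of Φ are the values commonly quoted from Vapnik et al. (1994), and the helper names (`train_fn`, `error_fn`, `make_data`) are hypothetical placeholders; treat this as a sketch under those assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def phi(tau, a=0.16, b=1.2, k=0.14928):
    """Theoretical bound on the expected maximum deviation xi as a function
    of tau = n / h.  Functional form and constants follow the formula
    commonly quoted from Vapnik et al. (1994); treat the exact constants
    here as assumptions of this sketch."""
    tau = np.atleast_1d(np.asarray(tau, dtype=float))
    out = np.ones_like(tau)                      # phi(tau) = 1 for tau <= 0.5
    big = tau > 0.5
    t = tau[big]
    num = np.log(2.0 * t) + 1.0
    out[big] = a * num / (t - k) * (1.0 + np.sqrt(1.0 + b * (t - k) / num))
    return out


def max_deviation(n, train_fn, error_fn, make_data, rng):
    """One measurement of the maximum deviation xi(n): draw two half-samples
    of size n with random labels, flip the labels of the second half, train
    on the union, and return the difference of error rates (measured with
    the original labels) between the two halves."""
    X1, y1 = make_data(n, rng)                   # assumed to return random binary labels
    X2, y2 = make_data(n, rng)
    X = np.vstack([X1, X2])
    y = np.concatenate([y1, 1 - y2])             # second half trained on flipped labels
    model = train_fn(X, y)
    err1 = error_fn(model, X1, y1)               # error on first half, original labels
    err2 = error_fn(model, X2, y2)               # error on second half, original labels
    return err2 - err1                           # maximized by training on flipped labels


def estimate_vc_dimension(design, train_fn, error_fn, make_data, seed=0):
    """Fit the effective VC-dimension h by least squares between the measured
    deviations and phi(n / h).  `design` is a list of (sample_size,
    num_repetitions) pairs, i.e. the measurement points whose placement the
    paper optimizes."""
    rng = np.random.default_rng(seed)
    ns, xis = [], []
    for n, reps in design:
        vals = [max_deviation(n, train_fn, error_fn, make_data, rng)
                for _ in range(reps)]
        ns.append(n)
        xis.append(np.mean(vals))
    ns, xis = np.asarray(ns, dtype=float), np.asarray(xis)

    def sse(h):
        return np.sum((xis - phi(ns / h)) ** 2)

    res = minimize_scalar(sse, bounds=(1.0, 10.0 * ns.max()), method="bounded")
    return res.x
```

A uniform design in the sense of the original procedure would be something like `design = [(n, 20) for n in range(20, 220, 20)]`; the contribution described in the abstract is to place these (sample size, repetition) pairs non-uniformly so that the fitted h is less sensitive to the sampling variability of the individual measurements.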
Similar Articles
On the VC-Dimension of Univariate Decision Trees
In this paper, we give and prove lower bounds of the VC-dimension of the univariate decision tree hypothesis class. The VC-dimension of the univariate decision tree depends on the VC-dimension values of its subtrees and the number of inputs. In our previous work (Aslan et al., 2009), we proposed a search algorithm that calculates the VC-dimension of univariate decision trees exhaustively. Using...
Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions
The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension obtained by discretizing the range of a real function class. Then, we point out that Sauer’s Lemma is valid for the discretized VC dimension. We group the real function classes having the infinite VC dimension into four categories by using the dis...
VC Dimension Bounds for Higher-Order Neurons
We investigate the sample complexity for learning using higher-order neurons. We calculate upper and lower bounds on the Vapnik-Chervonenkis dimension and the pseudo dimension for higher-order neurons that allow unrestricted interactions among the input variables. In particular, we show that the degree of interaction is irrelevant for the VC dimension and that the individual degree of the varia...
Learning a hyperplane classifier by minimizing an exact bound on the VC dimension
The VC dimension measures the complexity of a learning machine, and a low VC dimension leads to good generalization. While SVMs produce state-of-the-art learning performance, it is well known that the VC dimension of an SVM can be unbounded; despite good results in practice, there is no guarantee of good generalization. In this paper, we show how to learn a hyperplane classifier by minimizing an...
Deciding the Vapnik-Cervonenkis dimension is ...-complete
Linial et al. raised the question of how difficult it is to compute the Vapnik-Červonenkis dimension of a concept class over a finite universe. Papadimitriou and Yannakakis obtained a first answer using matrix representations of concept classes. We choose a more natural representation, which leads us to redefine the VC dimension problem. We establish that VC dimension is Σ₃ᵖ-complete, thereby gi...